Quantiles as Optimal Point Predictors
Author
Tilmann Gneiting
Abstract
The loss function plays a central role in the theory and practice of forecasting. If the loss is quadratic, the mean of the predictive distribution is the unique optimal point predictor. If the loss is linear, any median is an optimal point forecast. The title of the paper refers to the simple, possibly surprising fact that quantiles arise as optimal point predictors under a general class of economically relevant loss functions, which we refer to as generalized piecewise linear (GPL). The level of the quantile depends on a generic asymmetry parameter that reflects the possibly distinct costs of underprediction and overprediction. A loss function for which quantiles are optimal point predictors is necessarily GPL, in analogy to the classical fact that a loss function for which the mean is optimal is necessarily of the Bregman type. We prove general versions of these results that apply on any decision-observation domain and rest on weak assumptions. The empirical relevance of the choices made in passing from a predictive distribution to a point forecast is illustrated with the Bank of England’s density forecasts of United Kingdom inflation rates and with probabilistic predictions of wind energy resources in the Pacific Northwest.
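As a concrete illustration of the paper's central fact, the following minimal sketch (not taken from the paper; the lognormal sample, the level alpha = 0.8, and the grid of candidate forecasts are illustrative assumptions) checks numerically that the point forecast minimizing the expected asymmetric piecewise linear ("pinball") loss is the alpha-quantile of the distribution, not its mean.

    import numpy as np

    # Asymmetric piecewise linear ("pinball") loss at level alpha:
    #   L(x, y) = alpha * (y - x)       if y >= x  (underprediction)
    #           = (1 - alpha) * (x - y) if y <  x  (overprediction)
    def pinball_loss(x, y, alpha):
        return np.where(y >= x, alpha * (y - x), (1.0 - alpha) * (x - y))

    rng = np.random.default_rng(0)
    y = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # skewed "observations"

    alpha = 0.8                              # underprediction is the costlier error
    candidates = np.linspace(0.1, 6.0, 600)  # candidate point forecasts
    risks = [pinball_loss(x, y, alpha).mean() for x in candidates]

    best = candidates[int(np.argmin(risks))]
    print(f"risk-minimizing point forecast:             {best:.3f}")
    print(f"empirical 80% quantile:                     {np.quantile(y, alpha):.3f}")
    print(f"sample mean (optimal under quadratic loss): {y.mean():.3f}")

The first two printed values agree up to grid and sampling error, while the sample mean, which would be the optimal point forecast under quadratic loss, is noticeably smaller for this right-skewed sample.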
Similar references
Optimal quantile level selection for disease classification and biomarker discovery with application to electrocardiogram data.
Classification with a large number of predictors and biomarker discovery are becoming increasingly important in biological and medical research. This paper focuses on classifying cardiovascular diseases from electrocardiogram data, which involve many variables and many measurements within each variable. We propose an optimal quantile level selection procedure to reduce dime...
Structural Change Detection for Regression Quantiles under Time Series Non-stationarity
We consider quantile structural change testing for linear models with random designs and a wide class of non-stationary regressors and errors. New uniform Bahadur representations are established with nearly optimal approximation rates. Two CUSUM-type test statistics, one based on the regression coefficients and the other based on the gradient vectors, are considered. Two of the most frequently u...
QBoost: Predicting quantiles with boosting for regression and binary classification
In the framework of functional gradient descent/ascent, this paper proposes Quantile Boost (QBoost) algorithms which predict quantiles of the response of interest for regression and binary classification. Quantile Boost Re...
A comparison of approaches for stratifying on the propensity score to reduce bias.
Rationale, aims, and objectives: Stratification is a popular propensity score (PS) adjustment technique. It has been shown that stratifying the PS into 5 quantiles can remove over 90% of the bias due to the covariates used to generate the PS. Because of this finding, many investigators partition their data into 5 quantiles of the PS without examining whether a more robust solution (one that incr...
Efficient Robbins-Monro Procedure for Binary Data
The Robbins-Monro procedure does not perform well in the estimation of extreme quantiles, because the procedure is implemented using asymptotic results, which are not suitable for binary data. Here we propose a modification of the Robbins-Monro procedure and derive the optimal procedure for binary data under some reasonable approximations. The improvement obtained by using the optimal procedure... (A sketch of the classical recursion appears after this list.)
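For context on the last entry above, here is a minimal sketch of the classical Robbins-Monro recursion for estimating an extreme quantile (a 90% effective dose) from binary responses; this is the plain procedure that the abstract reports as performing poorly for extreme quantiles, not the proposed modification. The logistic dose-response curve, the starting dose, the step-size constant, and the number of trials are all illustrative assumptions.

    import numpy as np

    # Classical Robbins-Monro recursion for binary data: seek the dose x* with
    # response probability F(x*) = alpha, updating
    #   x_{n+1} = x_n - (c / n) * (Z_n - alpha),
    # where Z_n in {0, 1} is the binary outcome observed at dose x_n.
    rng = np.random.default_rng(1)

    def response_prob(x):
        # Hypothetical logistic dose-response curve (illustration only).
        return 1.0 / (1.0 + np.exp(-(x - 2.0)))

    alpha = 0.9      # extreme quantile, where the plain recursion struggles
    x, c = 0.0, 5.0  # starting dose and step-size constant (illustrative choices)
    for n in range(1, 20_001):
        z = rng.random() < response_prob(x)   # binary response at the current dose
        x = x - (c / n) * (float(z) - alpha)

    print(f"Robbins-Monro estimate of the 90% effective dose: {x:.3f}")
    print(f"true 90% effective dose:                          {2.0 + np.log(9.0):.3f}")

Convergence is slow and noisy at such extreme levels, which is the behaviour the cited paper sets out to improve with its modified, binary-data-specific procedure.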